perm filename KETONE[S80,JMC] blob sn#515651 filedate 1980-06-08 generic text, type C, neo UTF8
Comments on Ketonen's proposal for a proof checker
ekl.doc[ekl,jk] and rules.ekl[ekl,jk].

1. While arity doesn't matter much in the notation for the use
of functions, it is difficult to quantify over spaces of
functions where the arity isn't fixed.

2. The example given where a function takes arbitrarily many arguments
of one sort followed by a single argument of another sort is
only one case of such a function, and the notation is inadequate
for the general case.  Thus there may be variable numbers of
arguments of each of two different sorts.  If this is excluded, then
various ways of combining functions to get new ones will have
to be severely limited.

Thus suppose f takes an argument of type A and g an argument of
type B.  Then we can combine them to get h defined by

h(x,y) = f(x)+g(y)

Now suppose that f and g each take a variable number of arguments.
Then h is definable except that the notation lacks punctuation to
mark the division point between the arguments to be submitted to f
and those to be submitted to g.
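The ambiguity can be made concrete in a short sketch (modern Python, not EKL notation; the separator token and function names are illustrative):

```python
import math

# A sketch of the punctuation problem: f and g each take arbitrarily
# many arguments, so h needs an explicit marker to know where the
# arguments for f end and those for g begin.

SEP = object()   # the extra punctuation the notation would need

def f(*xs):      # arbitrarily many arguments of sort A
    return sum(xs)

def g(*ys):      # arbitrarily many arguments of sort B
    return math.prod(ys)

def h(*args):
    """h(x1,...,xm, SEP, y1,...,yn) = f(x1,...,xm) + g(y1,...,yn)."""
    i = args.index(SEP)
    return f(*args[:i]) + g(*args[i + 1:])
```

Without SEP the call h(1, 2, 3) is ambiguous: reading it as f(1,2)+g(3) gives 6, while f(1)+g(2,3) gives 7.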

Unless you think about it a great deal more, you are better off
sticking to fixed arities.  I am sure I haven't thought of all
the difficulties, but I remember getting lost in this swamp some
years ago.

Another problem is how to write assertions that quantify over
all the arguments.

3. Why no 0-ary functions and predicates?  Is it a punctuation
question?

4. Is the normal mode, as in FOL, on-line interaction with each step
checked for validity?

page 5 - There may be questions about the preceding.  I have been skipping.

	I don't see the need for undeclarations any more than
for unassumptions.

page 11 - I am not sure what it means to delete the lines depending
on a declaration, because it must be possible to use a declaration
temporarily and then discharge it just as an assumption can be
discharged.

Comments on RULES.EKL[EKL,JK]

	One of the inconveniences of TAUT in FOL is that one
has to use ⊃i afterwards in order to eliminate the dependencies.
It would be better if TAUT would allow the specification of
dependencies to be eliminated and check the legitimacy of the
elimination.

	Definitions of functions and predicates should be allowed
and checked to see if they are admissible as definitions.  They
should be dependencies of formulas involving them, but when the
symbol disappears the dependency should disappear so that temporary
definitions can be used.  It should be possible to define the same
symbol redundantly, and complaint should be made only if two
contradictory definitions would appear in the same formula.

	Both declarations and definitions (which are really two
forms of the same thing) should be treated like assumptions in
FOL; they should be carried as dependencies but be eliminable
as dependencies.
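The suggestion that definitions be carried as dischargeable dependencies might be sketched as follows (illustrative Python; representing a step as a formula/dependency pair, and checking occurrence by substring, are assumptions of the sketch, not part of the proposal):

```python
# Each proof step carries its formula together with the set of
# definitions and assumptions it depends on, as with assumptions in FOL.

def step(formula, deps):
    return {"formula": formula, "deps": set(deps)}

def discharge(st, symbol):
    """Drop the dependency on symbol's definition, but only once the
    symbol has disappeared from the formula (a temporary definition)."""
    if symbol not in st["formula"]:
        return {"formula": st["formula"], "deps": st["deps"] - {symbol}}
    return st   # the symbol still occurs; the dependency must stay

s = step("P(a)", {"f", "A"})   # f was used temporarily, then eliminated
s = discharge(s, "f")          # now depends only on the assumption A
```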

	The merging of proofs will be convenient only if means
exist to suppress some names used internally in proofs and to
change others.  Otherwise, proofs will have to be edited before
they can be combined.  The solution is to be able to refer to

	export(<proof>,a,...,c,(d,e),...,(y,z))

which is the same as <proof> but with the symbols a,...,c visible
from the outside and the symbol d outside standing for what is
called e inside, etc.  All other symbols internal to <proof> are
suppressed.  Perhaps the reverse default should also be allowed.

All the operations to files should have the Algol-like syntax

<filename> ← <proofname>

so that it can be generalized to

<filename> ← export(proof1,a,b) ∪ proof2;

In other words, proof-valued expressions should be treated like
other expressions.  Algol-like assignments and composition of
functions should be used wherever this is reasonable instead
of special syntactic forms.  Internally, this would be some
analog of (SETQ ...).
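Under the assumption that a proof can be represented as a table from symbols to steps, export and the proposed ∪ have direct analogs (a sketch in modern terms; none of the names are EKL syntax):

```python
# export(proof, a, ..., (d, e), ...): keep the listed symbols,
# renaming each pair (outside, inside), and suppress every other
# symbol internal to the proof.

def export(proof, *spec):
    visible = {}
    for s in spec:
        if isinstance(s, tuple):
            outside, inside = s
            visible[outside] = proof[inside]
        else:
            visible[s] = proof[s]
    return visible

proof1 = {"a": "step-a", "b": "step-b", "e": "step-e", "lemma": "internal"}
proof2 = {"c": "step-c"}

# <filename> ← export(proof1,a,b,(d,e)) ∪ proof2, with ∪ as table union
saved = export(proof1, "a", "b", ("d", "e")) | proof2
```

Once proofs are ordinary values like this, the Algol-like assignment form is a matter of notation rather than of special file commands.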

	More generally, I would like to go as far as possible toward
realizing the following idea, which I think is required for moving
smoothly between what one does "by hand" and what one writes
programs to do:

	1. All operations have a long form that is either an assignment
or the evaluation of a functional expression.  Short forms are obtained
from the long form by modes that allow defaults.  The mode itself is
a variable which is changed by assigning to it.  Consider the FOL operation
CANCEL that cancels a step.  This should be regarded as an abbreviation of
something analogous to

	proof ← cdr(proof);

and the latter form should be available for inclusion in programs.  Every
operation should have such a long form.
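A minimal sketch of the CANCEL example, assuming the proof is kept newest-first so that cdr(proof) discards the most recent step (the names are illustrative):

```python
# The proof as a list of steps, newest first; cdr(proof) = proof[1:].
proof = ["step3", "step2", "step1"]

def cancel():
    """Short form CANCEL, an abbreviation for the long assignment
    proof ← cdr(proof); the long form stays available to programs."""
    global proof
    proof = proof[1:]

cancel()   # typed interactively; a program can write proof = proof[1:]
```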

	I'm not sure this can be worked out, and it isn't relevant to
a proof checker, but even an editor might be regarded that way.  Thus
typing A to an editor might be considered as an abbreviation
of a statement analogous to

	(setq buffer (cons (quote A) buffer)).

More to the point for proof checking, the FOL

	∧i 37 45;

should be regarded as an abbreviation for

	(setq proof (∧i proof 37 45))

or perhaps

	(setq proof (cons (∧i proof 37 45) proof)).

Expressions like

	(⊃i (∧i proof 37 45) 52)

would then have an evident interpretation.
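One way to read this, sketched in Python under the assumption that a proof is a newest-first list of lines: each rule is a pure function from a proof and line numbers to a new line, so the command form merely conses the result on (the name and_i and the tuple representation are illustrative):

```python
# Lines are numbered from 1, oldest last, so line k of an n-line
# proof sits at index n - k of the newest-first list.

def and_i(proof, m, n):
    """∧-introduction: the conjunction of lines m and n."""
    line = lambda k: proof[len(proof) - k]
    return ("and", line(m), line(n))

proof = ["Q", "P"]                    # line 1 is "P", line 2 is "Q"
proof = [and_i(proof, 1, 2)] + proof  # the long form of "∧i 1 2;"
```

Since and_i merely returns a value, a hypothetical imp_i of the same shape would give the memo's nested expression (⊃i (∧i proof 37 45) 52) the same evident reading, with no further machinery.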

	Various side-effects may make a simple-minded approach of
this kind impractical, but commands and operators should be regarded
as assignments of functional expressions as much as possible.